Matrix Nearness Problems with Bregman Divergences


Similar resources

Matrix Nearness Problems with Bregman Divergences

This paper discusses a new class of matrix nearness problems that measure approximation error using a directed distance measure called a Bregman divergence. Bregman divergences offer an important generalization of the squared Frobenius norm and relative entropy, and they all share fundamental geometric properties. In addition, these divergences are intimately connected with exponential families...
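
As a reading aid, the general form of a Bregman divergence and the two special cases named above are shown below; these are the standard definitions for a strictly convex seed function \varphi, not quoted from the paper's text.

    D_\varphi(X, Y) = \varphi(X) - \varphi(Y) - \langle \nabla\varphi(Y),\, X - Y \rangle

    \varphi(X) = \tfrac{1}{2}\|X\|_F^2        \;\Rightarrow\;  D_\varphi(X, Y) = \tfrac{1}{2}\|X - Y\|_F^2    (squared Frobenius norm)
    \varphi(X) = \sum_{ij} X_{ij}\log X_{ij}  \;\Rightarrow\;  D_\varphi(X, Y) = \sum_{ij}\bigl(X_{ij}\log\tfrac{X_{ij}}{Y_{ij}} - X_{ij} + Y_{ij}\bigr)    ((generalized) relative entropy)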

Generalized Nonnegative Matrix Approximations with Bregman Divergences

Nonnegative matrix approximation (NNMA) is a recent technique for dimensionality reduction and data analysis that yields a parts-based, sparse nonnegative representation for nonnegative input data. NNMA has found a wide variety of applications, including text analysis, document clustering, face/image recognition, language modeling, speech processing and many others. Despite these numerous appli...
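
As a rough illustration of the basic NNMA setup this abstract builds on, the sketch below runs the classical multiplicative updates for the squared Frobenius error, i.e. the Bregman special case \varphi(x) = x^2/2 that the paper generalizes. The matrix sizes, random data, and the function name nnma_frobenius are made up for the example; this is not the paper's algorithm.

    import numpy as np

    def nnma_frobenius(A, rank, iters=200, eps=1e-9):
        """Approximate a nonnegative matrix A by W @ H with W, H >= 0, using the
        classical multiplicative updates for the squared Frobenius error."""
        m, n = A.shape
        rng = np.random.default_rng(0)
        W = rng.random((m, rank))
        H = rng.random((rank, n))
        for _ in range(iters):
            H *= (W.T @ A) / (W.T @ W @ H + eps)   # update H; elementwise ratios keep H nonnegative
            W *= (A @ H.T) / (W @ H @ H.T + eps)   # update W; elementwise ratios keep W nonnegative
        return W, H

    # Tiny usage example with random nonnegative data (illustrative only).
    A = np.random.default_rng(1).random((20, 30))
    W, H = nnma_frobenius(A, rank=5)
    print(np.linalg.norm(A - W @ H, "fro"))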

Mining Matrix Data with Bregman Matrix Divergences for Portfolio Selection

If only we always knew ahead of time.... The dream of any stock portfolio manager is to allocate stocks in his portfolio in hindsight so as to always reach maximum wealth. With hindsight, over a given time period, the best strategy is to invest in the best performing stock over that period. However, even this appealing strategy is not without regret. Reallocating every day to the best ...

Clustering with Bregman Divergences

A wide variety of distortion functions, such as squared Euclidean distance, Mahalanobis distance, Itakura-Saito distance and relative entropy, have been used for clustering. In this paper, we propose and analyze parametric hard and soft clustering algorithms based on a large class of distortion functions known as Bregman divergences. The proposed algorithms unify centroid-based parametric clust...
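
For intuition, here is a minimal sketch of the hard-clustering loop described above, instantiated with the generalized I-divergence on positive vectors (one member of the Bregman family); the data, names, and parameters are invented for the example. It relies on the fact that, for any Bregman divergence, the divergence-minimizing representative of a cluster is its ordinary mean.

    import numpy as np

    def idiv(x, y):
        """Generalized I-divergence (a Bregman divergence) between positive vectors."""
        return np.sum(x * np.log(x / y) - x + y)

    def bregman_hard_cluster(X, k, iters=50, seed=0):
        """k-means-style clustering with a Bregman divergence in place of
        squared Euclidean distance; centroids are still plain means."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(iters):
            # Assign each point to the center with the smallest divergence D(x, center).
            labels = np.array([np.argmin([idiv(x, c) for c in centers]) for x in X])
            # The optimal representative under any Bregman divergence is the cluster mean.
            centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return labels, centers

    # Toy usage: strictly positive points, as the I-divergence requires.
    X = np.random.default_rng(1).uniform(0.5, 10.0, size=(100, 3))
    labels, centers = bregman_hard_cluster(X, k=3)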

Low-Rank Kernel Learning with Bregman Matrix Divergences

In this paper, we study low-rank matrix nearness problems, with a focus on learning low-rank positive semidefinite (kernel) matrices for machine learning applications. We propose efficient algorithms that scale linearly in the number of data points and quadratically in the rank of the input matrix. Existing algorithms for learning kernel matrices often scale poorly, with running times that are c...
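
To make the divergences involved concrete, the snippet below simply evaluates two matrix Bregman divergences commonly used with positive semidefinite kernel matrices, the von Neumann and LogDet divergences, on small random matrices. It is an illustrative computation only, not the learning algorithm the paper proposes, and the variable names are invented.

    import numpy as np
    from scipy.linalg import logm

    def von_neumann_div(X, Y):
        """Von Neumann divergence, generated by phi(X) = tr(X log X - X)."""
        return np.trace(X @ logm(X) - X @ logm(Y) - X + Y).real

    def logdet_div(X, Y):
        """LogDet (Burg) divergence, generated by phi(X) = -log det X."""
        n = X.shape[0]
        M = X @ np.linalg.inv(Y)
        return np.trace(M) - np.log(np.linalg.det(M)) - n

    # Two small positive definite matrices for illustration.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4)); X = A @ A.T + np.eye(4)
    B = rng.normal(size=(4, 4)); Y = B @ B.T + np.eye(4)
    print(von_neumann_div(X, Y), logdet_div(X, Y))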

Journal

Journal title: SIAM Journal on Matrix Analysis and Applications

Year: 2008

ISSN: 0895-4798, 1095-7162

DOI: 10.1137/060649021